Equivalent error bars for neural network classifiers trained by Bayesian inference

Author

  • Peter Sykacek
Abstract

The topic of this paper is the problem of outlier detection for neural networks trained by Bayesian inference. I will show that marginalization is not a good method to obtain moderated class probabilities in outlying regions. The reason why marginalization fails to indicate outliers is analysed, and an alternative measure that is a more reliable indicator of outliers is proposed. A simple artificial classification problem is used to visualize the differences. Finally, both methods are used to classify a real-world problem where outlier detection is mandatory.
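The contrast can be sketched with a toy one-dimensional logistic classifier. The posterior samples below are fabricated for illustration (they are not drawn from a real posterior), and the dispersion measure is a simple stand-in in the spirit of the paper's proposal, not its exact quantity. Far from the data, the marginalized probability can stay well away from the uninformative value 1/#classes, while the disagreement between individual posterior samples reveals the outlier:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative posterior samples over (weight, bias) of a 1-D logistic model.
w_samples = rng.normal(1.0, 1.5, size=500)
b_samples = rng.normal(0.0, 0.3, size=500)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predictive(x):
    """Class-1 probability at input x under each posterior sample."""
    return sigmoid(w_samples * x + b_samples)

results = {}
for x in (0.0, 10.0):            # inside the data region vs. far outside it
    p = predictive(x)
    marginal = p.mean()          # marginalized (posterior-averaged) probability
    spread = p.std()             # dispersion of per-sample predictions
    results[x] = (marginal, spread)
    print(f"x={x:5.1f}  marginal P(c1)={marginal:.3f}  spread={spread:.3f}")
```

At the outlying input the marginalized probability remains decidedly non-uniform, whereas the large spread across posterior samples flags the point as outside the training distribution.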


Similar articles

Why Error Measures are Sub-Optimal for Training Neural Network Pattern Classifiers

Pattern classifiers that are trained in a supervised fashion (e.g., multi-layer perceptrons, radial basis functions, etc.) are typically trained with an error measure objective function such as mean-squared error (MSE) or cross-entropy (CE). These classifiers can in theory yield (optimal) Bayesian discrimination, but in practice they often fail to do so. We explain why this happens. In so doing...


Bayesian Neural Networks for Survival Analysis: a Comparative Study

This paper describes two neural network models aimed at survival analysis modelling. These models are based on formulations of the survival analysis problem in continuous and discrete time. Both models are described in a Bayesian inference framework to increase their robustness and to reduce the risk of overfitting. We test the models on real data, to predict survival from intraocular melanoma,...


Deep Neural Networks as Gaussian Processes

A deep fully-connected neural network with an i.i.d. prior over its parameters is equivalent to a Gaussian process (GP) in the limit of infinite network width. This correspondence enables exact Bayesian inference for neural networks on regression tasks by means of straightforward matrix computations. For single hidden-layer networks, the covariance function of this GP has long been known. Recent...
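The "straightforward matrix computations" can be sketched in plain NumPy. The squared-exponential kernel below is an ordinary stand-in, not the actual NN-induced covariance function; the sketch only illustrates the form of exact GP posterior inference:

```python
import numpy as np

def rbf_kernel(A, B, length=1.0, amp=1.0):
    """Squared-exponential kernel; a stand-in for the NN-induced GP kernel."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return amp * np.exp(-0.5 * d2 / length**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    """Exact GP posterior mean and variance at test points Xs."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

X = np.linspace(-3, 3, 20)[:, None]      # training inputs
y = np.sin(X).ravel()                    # noiseless targets for illustration
Xs = np.array([[0.0], [10.0]])           # one in-range, one far-away test point
mean, var = gp_posterior(X, y, Xs)
```

The posterior mean interpolates the data near the training inputs, while the predictive variance reverts to the prior far away from them.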


Learning Document Image Features With SqueezeNet Convolutional Neural Network

The classification of various document images is considered an important step towards building a modern digital library or office automation system. Convolutional Neural Network (CNN) classifiers trained with backpropagation are considered to be the current state of the art model for this task. However, there are two major drawbacks for these classifiers: the huge computational power demand for...


Comparison of Neural Network Models, Vector Auto Regression (VAR), Bayesian Vector-Autoregressive (BVAR), Generalized Auto Regressive Conditional Heteroskedasticity (GARCH) Process and Time Series in Forecasting Inflation in Iran

This paper has two aims. The first is forecasting inflation in Iran using macroeconomic variable data for Iran (inflation rate, liquidity, GDP, prices of imported goods and exchange rates), and the second is comparing the forecasting performance of vector auto regression (VAR), Bayesian vector-autoregressive (BVAR), GARCH, time series and neural network models by which Iran's inflation is for...



Publication year: 1997